

units of information : Wikipedia (English edition)
units of information
In computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels. In information theory, units of information are also used to measure the information contents or entropy of random variables.
The most common units are the bit, the capacity of a system that can exist in only two states, and the byte (or octet), which is equivalent to eight bits. Multiples of these units can be formed with the SI prefixes (power-of-ten prefixes) or the newer IEC binary prefixes (binary-power prefixes). Information capacity is a dimensionless quantity.
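The gap between the two prefix systems can be made concrete with a short sketch (the `describe` helper and the prefix tables below are illustrative, not part of any standard library):

```python
# SI (power-of-ten) prefixes vs IEC (binary-power) prefixes for the byte.
SI = {"kB": 10**3, "MB": 10**6, "GB": 10**9}       # kilo-, mega-, gigabyte
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30}   # kibi-, mebi-, gibibyte

def describe(n_bytes):
    """Express a byte count in megabytes (SI) and mebibytes (IEC)."""
    return n_bytes / SI["MB"], n_bytes / IEC["MiB"]

mb, mib = describe(2**20)
print(mb, mib)   # 1.048576 MB, but exactly 1.0 MiB
```

The discrepancy grows with each prefix step: a mebibyte is about 4.9 % larger than a megabyte, while a gibibyte is about 7.4 % larger than a gigabyte.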
==Primary units==

In 1928, Ralph Hartley observed a fundamental storage principle,〔Norman Abramson (1963), ''Information theory and coding''. McGraw-Hill.〕 which was further formalized by Claude Shannon in 1945: the information that can be stored in a system is proportional to the logarithm log''b'' ''N'' of the number ''N'' of possible states of that system. Changing the base of the logarithm from ''b'' to a different number ''c'' has the effect of multiplying the value of the logarithm by a fixed constant, namely log''c'' ''N'' = (log''c'' ''b'') log''b'' ''N''.
Therefore, the choice of the base ''b'' determines the unit used to measure information. In particular, if ''b'' is a positive integer, then the unit is the amount of information that can be stored in a system with ''b'' possible states.
When ''b'' is 2, the unit is the shannon, equal to the information content of one "bit" (a contraction of binary digit). A system with 8 possible states, for example, can store up to log2 8 = 3 bits of information. Other units that have been named include:
* Base ''b'' = 3: the unit is called "trit", and is equal to log2 3 (≈ 1.585) bits.〔Donald E. Knuth, ''The Art of Computer Programming'', vol.2: ''Seminumerical algorithms''.〕
* Base ''b'' = 10: the unit is called ''decimal digit'', ''hartley'', ''ban'', ''decit'', or ''dit'', and is equal to log2 10 (≈ 3.322) bits.〔Shanmugam (2006), ''Digital and Analog Computer Systems''.〕〔Gregg Jaeger (2007), ''Quantum information: an overview''.〕〔I. Ravi Kumar (2001), ''Comprehensive Statistical Theory of Communication''.〕
* Base ''b'' = ''e'', the base of natural logarithms: the unit is called a ''nat'', ''nit'', or ''nepit'' (from Neperian), and is worth log2 ''e'' (≈ 1.443) bits.
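The change-of-base relationship behind these units can be checked numerically; a minimal sketch, in which the function name `information` is an illustrative choice rather than anything from the article:

```python
import math

def information(n_states, base=2):
    """Information stored in a system with n_states equally likely states,
    in the unit defined by `base` (2 = bits/shannons, 3 = trits,
    math.e = nats, 10 = hartleys/bans)."""
    return math.log(n_states) / math.log(base)

# A system with 8 possible states holds log2 8 = 3 bits:
print(information(8, 2))                          # ≈ 3.0
# One trit is log2 3 ≈ 1.585 bits; one hartley is log2 10 ≈ 3.322 bits:
print(information(3, 2), information(10, 2))
```

Dividing by `math.log(base)` is exactly the constant-factor conversion log''c'' ''N'' = (log''c'' ''b'') log''b'' ''N'' described above.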
The trit, ban, and nat are rarely used to measure storage capacity, but the nat in particular is often used in information theory, because natural logarithms are sometimes more convenient than logarithms in other bases.
